Distance-based and continuum Fano inequalities with applications to statistical estimation
Authors
Abstract
In this technical note, we give two extensions of the classical Fano inequality in information theory. The first extends Fano’s inequality to the setting of estimation, providing lower bounds on the probability that an estimator of a discrete quantity is within some distance t of the quantity. The second inequality extends our bound to a continuum setting and provides a volume-based bound. We illustrate how these inequalities lead to direct and simple proofs of several statistical minimax lower bounds.
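For context, the classical Fano inequality that the note generalizes can be stated as follows. The distance-based variant after it is only a sketch of the type of bound the abstract describes; the exact constants and the definition of the neighborhood-size term $N_t^{\max}$ are assumptions here and should be checked against the full text.

```latex
% Classical Fano inequality: V uniform on a finite set \mathcal{V},
% Markov chain V \to X \to \hat{V}:
\mathbb{P}\bigl(\hat{V} \neq V\bigr)
  \;\ge\; 1 - \frac{I(V; X) + \log 2}{\log |\mathcal{V}|}.

% Distance-based variant (sketch): for a metric \rho on \mathcal{V} and
% the largest t-neighborhood size
%   N_t^{\max} := \max_{v \in \mathcal{V}}
%                 \bigl|\{v' \in \mathcal{V} : \rho(v, v') \le t\}\bigr|,
% the error event \{\hat{V} \neq V\} is replaced by \{\rho(\hat{V}, V) > t\}:
\mathbb{P}\bigl(\rho(\hat{V}, V) > t\bigr)
  \;\ge\; 1 - \frac{I(V; X) + \log 2}{\log\bigl(|\mathcal{V}| / N_t^{\max}\bigr)}.
```

Setting $t = 0$ (so $N_t^{\max} = 1$) recovers the classical bound, which is how the distance-based form extends it.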
Similar articles
Uniformly Consistent Empirical Likelihood Estimation Subject to a Continuum of Unconditional Moment Inequalities
This paper extends moment-based estimation procedures to statistical models defined by a continuum of unconditional moment inequalities. The underlying probability distribution in the model is the (infinite dimensional) parameter of interest. For a general class of estimating functions that indexes the continuum of moments, we develop the estimation theory of this parameter using the method of ...
Parameter Estimation of Some Archimedean Copulas Based on Minimum Cramér-von-Mises Distance
The purpose of this paper is to introduce a new method for estimating the Archimedean copula dependence parameter in the non-parametric setting. The dependence parameter is selected as the value that minimizes the Cramér-von-Mises distance, which measures the distance between the empirical Bernstein Kendall distribution function and the true Kendall distribution functi...
Moment Inequalities for Supremum of Empirical Processes of U-Statistic Structure and Application to Density Estimation
We derive moment inequalities for the supremum of empirical processes of U-statistic structure and give applications to kernel-type density estimation and to estimation of the distribution function for functions of observations.
Information Measures via Copula Functions
In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. Among them, in this paper, we will verify measures such as Kullback-Leibler information, J-divergence, Hellinger distance, -Divergence, … and so on. Properties and results related to distance between probability d...
Local Privacy, Data Processing Inequalities, and Statistical Minimax Rates
Working under a model of privacy in which data remains private even from the statistician, we study the tradeoff between privacy guarantees and the utility of the resulting statistical estimators. We prove bounds on information-theoretic quantities, including mutual information and Kullback-Leibler divergence, that depend on the privacy guarantees. When combined with standard minimax techniques...
Journal: CoRR
Volume: abs/1311.2669
Pages: -
Publication date: 2013